Mobile apps fueling AI-generated nudes of young girls: Spanish police
View Date:2025-01-11 06:49:18
A town in Spain made international headlines after a number of young schoolgirls said they received fabricated nude images of themselves that were created using an easily accessible "undressing app" powered by artificial intelligence, raising a larger discussion about the harm these tools can cause.
"Today a smartphone can be considered as a weapon," Jose Ramon Paredes Parra, the father of a 14-year-old victim, told ABC News. "A weapon with a real potential of destruction and I don't want it to happen again."
Over 30 victims between the ages of 12 and 14 have been identified so far, and an investigation has been ongoing since Sept. 18, Spanish National Police told ABC News.
And while most of the victims are from Almendralejo, a town in the southwest of Spain at the center of this controversy, Spanish National Police say they have also found victims in other parts of the country.
A group of male perpetrators, who police say knew most of the victims, used photos taken from the social media profiles of female victims and uploaded them to a nudify app, authorities told ABC News.
Nudify is a term used to describe an AI-powered tool designed to remove clothing from a subject in a photo. In this case, the service can be used via Telegram or via an app you download on your phone.
MORE: Sharing deepfake pornography could soon be illegal in America
These same perpetrators, also minors, created a group chat on WhatsApp and on Telegram to disseminate these non-consensual fabricated nude images, authorities told ABC News. The fake images were used to extort at least one victim on Instagram for real nude images or money, said the parent of one of the victims.
"This is a direct abuse of women and girls, by technology that is specifically designed to abuse women and girls," explained Professor Clare McGlynn, a law professor at Durham University in the U.K. and an expert on violence against women and girls.
Experts tell ABC News all it takes to make a hyper-realistic non-consensual deepfake is a photo, an email address and a few dollars if you want to create them in bulk.
Some of these nudify apps lack safeguards on which images can be uploaded for alteration, meaning some are being used to create child sexual abuse material (CSAM).
ABC News reviewed the nudify app Spanish authorities say was used to create these AI-generated explicit images of young girls. The app offers a free service that can be used through Telegram, as well as an app that you can download on your phone.
The app also offers a premium paid service that accepts payment methods such as Google Pay, Visa, Mastercard, and Paypal.
A Visa spokesperson told ABC News that the company does not permit its network to be used for illegal activity. "Visa rules require merchants to obtain consent from all persons depicted in any adult content, including computer-generated or computer-modified content, such as deepfakes," the spokesperson added.
ABC News reached out to the app in question but has not heard back. ABC News has also reached out to Visa, Mastercard, and Paypal.
Parra and his wife, Dr. Miriam Al Adib Mendiri, went directly to local police after their daughter confided in them that she had been targeted. They also decided to use Mendiri's large social media following to denounce the crime publicly.
"Here we are united to STOP THIS NOW. Using other people's images to do this barbarity and spread them, is a very serious crime," Mendiri shared in an Instagram video. "[…] Girls, don't be afraid to report such acts. Tell your mothers."
Mendiri's public appeal led to many more victims coming forward to police. Local authorities say that some of the perpetrators are under 14 years old, meaning they will be tried under juvenile criminal law. Investigations are ongoing, confirmed Spanish National Police.
"If they do not understand what they did now, if they don't realize it, what they will become later?" said Parra. "Maybe rapist, maybe gender violent perpetrator… they need to be educated and to change now."
Experts like McGlynn believe the focus should be on the global search platforms and the apps that facilitate the creation of non-consensual imagery.
"Google returns nudify websites at the top of its rankings, enabling and legitimizing these behaviors," McGlynn said. "There is no legitimate reason to use nudify apps without consent. They should be de-ranked by search platforms such as Google."
Another expert who founded a company to help individuals remove leaked private content online agreed with McGlynn.
"Apps that are designed to essentially unclothe unsuspecting women have zero place in our society, let alone search engines," said Dan Purcell, founder of Ceartas. "We are entering an endemic of kids using AI to undress kids, and everyone should be concerned and outraged."
This is not the first example of generative AI technology being used to inflict harm on girls and women. In 2023, Nassau County officials in New York charged a man who in 2019 fabricated sexually explicit deepfake images of more than a dozen underage women and shared them on a pornographic website.
"It's not just an AI problem. Society needs to face it now or it will be even worse later, this is just the beginning. It needs a proper regulation," Parra argued.